# Chinese Text Filling
**Chinese Roberta L 12 H 768** — Chinese pre-trained language model based on the RoBERTa architecture, with 12 Transformer layers and a hidden-layer dimension of 768.
- Tags: Large Language Model, Chinese
- Publisher: uer · Downloads: 419 · Likes: 13
**Chinese Roberta L 2 H 128** — A compact Chinese RoBERTa model pre-trained on CLUECorpusSmall, featuring 2 Transformer layers and 128-dimensional hidden layers, suitable for various Chinese natural language processing tasks.
- Tags: Large Language Model, Chinese
- Publisher: uer · Downloads: 1,141 · Likes: 11
**Albert Large Chinese Cluecorpussmall** — Chinese ALBERT model pre-trained with the UER-py framework on the CLUECorpusSmall corpus, suitable for Chinese text-processing tasks.
- Tags: Large Language Model, Transformers, Chinese
- Publisher: uer · Downloads: 17 · Likes: 4
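The model names above encode their size in an `L <layers> H <hidden-size>` suffix (e.g. `L 12 H 768` means 12 Transformer layers with 768-dimensional hidden states), the same convention used by the BERT/RoBERTa miniature releases. A minimal sketch of reading that convention programmatically — `parse_uer_size` is a hypothetical helper written for illustration, not a function shipped by any hub or library:

```python
import re

def parse_uer_size(name: str) -> dict:
    """Extract layer count and hidden size from an 'L <n> H <m>' style
    model name (hypothetical helper, shown only to document the naming
    convention; separators may be spaces, underscores, or hyphens)."""
    m = re.search(r"L[\s_-]+(\d+)[\s_-]+H[\s_-]+(\d+)", name)
    if m is None:
        raise ValueError(f"no L/H size suffix found in {name!r}")
    return {
        "num_hidden_layers": int(m.group(1)),  # Transformer layer count
        "hidden_size": int(m.group(2)),        # hidden-state dimension
    }

# The first model above: 12 layers, 768-dimensional hidden states.
print(parse_uer_size("Chinese Roberta L 12 H 768"))
```

The same parser also accepts hub-style identifiers such as `chinese_roberta_L-2_H-128`, which resolves to 2 layers and a 128-dimensional hidden size — matching the second model's description.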